Nano-artifact metrics based on random collapse of resist
Artifact metrics is an information security technology that uses the
intrinsic characteristics of a physical object for authentication and clone
resistance. Here, we demonstrate nano-artifact metrics based on silicon
nanostructures formed via an array of resist pillars that randomly collapse
when exposed to electron-beam lithography. The proposed technique uses
conventional and scalable lithography processes, and because of the random
collapse of resist, the resultant structure has extremely fine-scale morphology
with a minimum dimension below 10 nm, which is less than the resolution of
current lithography capabilities. By evaluating false match, false non-match
and clone-resistance rates, we show that the nanostructured patterns based
on resist collapse satisfy the requirements for high-performance security
applications.
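The false match and false non-match rates mentioned above can be illustrated with a minimal sketch. The code below is not the paper's actual evaluation procedure; it assumes, for illustration, that each device's nanostructure is summarized as a binary feature vector, that repeated reads of the same device differ by independent bit flips, and that similarity is measured as the fraction of agreeing bits.

```python
import random

random.seed(0)

def hamming_similarity(a, b):
    """Fraction of matching bits between two binary feature vectors."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def noisy_read(pattern, flip_prob=0.05):
    """Simulate a repeated measurement of the same physical pattern
    (flip_prob is an assumed measurement-noise level, not a paper value)."""
    return [bit ^ (random.random() < flip_prob) for bit in pattern]

N_BITS, N_DEVICES = 256, 50
devices = [[random.randint(0, 1) for _ in range(N_BITS)]
           for _ in range(N_DEVICES)]

# Genuine scores: two independent noisy reads of the same device.
genuine = [hamming_similarity(noisy_read(d), noisy_read(d)) for d in devices]
# Impostor scores: reads of two different devices.
impostor = [hamming_similarity(noisy_read(devices[i]), noisy_read(devices[j]))
            for i in range(N_DEVICES) for j in range(i + 1, N_DEVICES)]

threshold = 0.75  # accept a match when similarity exceeds this
fnmr = sum(s < threshold for s in genuine) / len(genuine)    # false non-match rate
fmr = sum(s >= threshold for s in impostor) / len(impostor)  # false match rate
print(f"FNMR={fnmr:.3f}  FMR={fmr:.3f}")
```

With well-separated genuine (near 0.9) and impostor (near 0.5) score distributions, both error rates stay close to zero for a mid-range threshold; the security claim in the abstract amounts to showing this separation holds for physically fabricated and cloned structures.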
Unidirectional light propagation through two-layer nanostructures based on optical near-field interactions
We theoretically demonstrate direction-dependent polarization conversion
efficiency, yielding unidirectional light transmission, through a two-layer
nanostructure by using the angular spectrum representation of optical
near-fields. The theory provides results that are consistent with
electromagnetic numerical simulations. This study reveals that optical
near-field interactions among nanostructured matter can provide unique optical
properties, such as the unidirectionality observed here, and offers fundamental
guiding principles for understanding and engineering nanostructures for
realizing novel functionalities.
Programming Abstractions for Data Locality
The goal of the workshop and this report is to identify common themes and standardize concepts for locality-preserving abstractions for exascale programming models. Current software tools are built on the premise that computing is the most expensive component, but we are rapidly moving to an era in which computing is cheap and massively parallel while data movement dominates energy and performance costs. In order to respond to exascale systems (the next generation of high performance computing systems), the scientific computing community needs to refactor its applications to align with the emerging data-centric paradigm. Applications must evolve to express information about data locality. Unfortunately, current programming environments offer few ways to do so. They ignore the incurred cost of communication and simply rely on hardware cache coherency to virtualize data movement. With the increasing importance of task-level parallelism on future systems, task models have to support constructs that express data locality and affinity. At the system level, communication libraries implicitly assume that all processing elements are equidistant from each other. In order to take advantage of emerging technologies, application developers need a set of programming abstractions to describe data locality for the new computing ecosystem. The new programming paradigm should be more data centric and should allow developers to describe how to decompose and how to lay out data in memory. Fortunately, there are many emerging concepts, such as constructs for tiling, data layout, array views, task and thread affinity, and topology-aware communication libraries, for managing data locality. There is an opportunity to identify commonalities in strategy that enable us to combine the best of these concepts into a comprehensive approach to expressing and managing data locality on exascale programming systems.
These programming model abstractions can expose crucial information about data locality to the compiler and runtime system to enable performance-portable code. The research question is to identify the right level of abstraction, which includes techniques that range from template libraries all the way to completely new languages, to achieve this goal.
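Among the locality-preserving constructs the report names, tiling is the simplest to sketch. The example below is an illustrative toy, not any particular library's API: it expresses a 2-D traversal as an explicit sequence of tiles so that all accesses within a tile stay close together, which is the information a locality-aware runtime or compiler would want the programmer to expose.

```python
def tiled_indices(n, m, tile):
    """Yield (rows, cols) blocks that traverse an n x m array tile by tile,
    so each tile's data is reused while still resident in fast memory."""
    for bi in range(0, n, tile):
        for bj in range(0, m, tile):
            yield (range(bi, min(bi + tile, n)),
                   range(bj, min(bj + tile, m)))

def transpose_tiled(a, tile=4):
    """Transpose a dense matrix (list of lists) using a tiled traversal.
    A naive row-major transpose strides through the output with poor
    locality; tiling keeps both input and output accesses blocked."""
    n, m = len(a), len(a[0])
    out = [[0] * n for _ in range(m)]
    for rows, cols in tiled_indices(n, m, tile):
        for i in rows:
            for j in cols:
                out[j][i] = a[i][j]
    return out

A = [[i * 6 + j for j in range(6)] for i in range(5)]
T = transpose_tiled(A, tile=4)
assert all(T[j][i] == A[i][j] for i in range(5) for j in range(6))
```

Pure Python will not show a cache-level speedup, but the structure is the point: the tile size and traversal order are explicit, machine-tunable parameters rather than being implicit in loop order, which is the kind of abstraction the report argues exascale programming models should standardize.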
Nano-Photonic Metrics: Fundamentals and Experimental Demonstration
As the popularity of the Internet of Things (IoT) increases, there is considerable demand for improved physical security, owing to the growing number of edge devices. However, the fabrication and measurement techniques available to attackers are also improving continuously, and hence it is becoming increasingly difficult to ensure the security of each device using conventional approaches. To counter such evolving attacks, the concept of nano-photonic metrics has been proposed, based on a functional collaboration between existing physical security and near-field optical techniques. In this approach, the optical signals obtained from optical near-field interactions, which are induced between a target bearing nano-scale structures and the tip of a scanning probe serving as the reader, are defined as the unique features of each device to be authenticated. When attackers attempt spoofing, they must fabricate not only clones of the original nano-scale structures but also the scanning probe; otherwise, they cannot impersonate regular users. Moreover, estimating the nano-scale structures of the target and the characteristics of the probe is typically a complex inverse problem. Therefore, this approach is expected to enable a novel form of authentication. In this paper, we report quantitative evaluations of its performance from the viewpoint of physical security and experimental verification of the practicality of the proposed approach.